
    How Useful is Quantitative Risk Assessment?

    This article discusses the use of Quantitative Risk Assessment (QRA) in decision-making regarding the safety of complex technological systems. The insights gained from QRA are compared with those from traditional safety methods, and it is argued that the two approaches complement each other and that peer review is an essential part of the QRA process. The importance of risk-informed rather than risk-based decision-making is emphasized. Engineering insights derived from QRAs are always used in combination with traditional safety requirements, and it is in this context that they should be reviewed and critiqued. Examples from applications in nuclear power, space systems, and a chemical-agent incinerator demonstrate the practical benefits of QRA. Finally, several common criticisms raised against QRA are addressed.

    Biomagnetic methodologies for the noninvasive investigations of the human brain (Magnobrain)

    Magnetoencephalography (MEG) non-invasively infers the distribution of electric currents in the brain by measuring the magnetic fields they induce. Its superb spatial and temporal resolution provides a solid basis for the 'functional imaging' of the brain, provided it is integrated with other brain imaging techniques. MAGNOBRAIN is an applied research project that developed tools to integrate MEG with MRI and EEG. These include: (1) software for MEG-oriented MRI feature extraction; (2) the Brain Data Base (BDB), a reference library of information on the brain used for more realistic and biologically meaningful functional localisations through MEG and EEG; and (3) a database of normative data (age- and sex-matched) for the interpretation of MEG. These tools are expected to evolve into a medical informatics environment that will aid the planning of neurosurgical operations and contribute to the exploration of mental function, including the study of perception and cognition.

    Quantitative functional failure analysis of a thermal-hydraulic passive system by means of bootstrapped Artificial Neural Networks

    The functional failure probability of a thermal-hydraulic (T-H) passive system can be estimated by Monte Carlo (MC) sampling of the epistemic uncertainties affecting the system model and the numerical values of its parameters, followed by the computation of the system response by a mechanistic T-H code for each sample. The computational effort associated with this approach can be prohibitive, because a large number of lengthy T-H code simulations must be performed (one for each sample) to quantify the functional failure probability and the related statistics accurately. In this paper, the computational burden is reduced by replacing the long-running, original T-H code with a fast-running, empirical regression model: in particular, an Artificial Neural Network (ANN) model is considered. It is constructed on the basis of a limited-size set of data representing examples of the input/output nonlinear relationships underlying the original T-H code; once built, the model is used to perform, in an acceptable computational time, the numerous system response calculations needed for accurate failure probability estimation, uncertainty propagation, and sensitivity analysis. The empirical approximation of the system response provided by the ANN model introduces an additional source of (model) uncertainty, which needs to be evaluated and accounted for. A bootstrapped ensemble of ANN regression models is built to quantify, in terms of confidence intervals, the (model) uncertainties associated with the estimates provided by the ANNs. For demonstration purposes, an application to the functional failure analysis of an emergency passive decay heat removal system in a simple steady-state model of a Gas-cooled Fast Reactor (GFR) is presented. The functional failure probability of the system is estimated together with global Sobol sensitivity indices. The bootstrapped ANN regression model, built with low computational time on few (e.g., 100) data examples, is shown to provide point estimates that are both reliable (very close to the true values of the quantities of interest) and robust (with confidence intervals satisfactorily narrow around those values).
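
    A minimal sketch of the bootstrapped-ANN procedure described above, in Python. This is not the authors' code: the stand-in function th_code, the failure threshold T_FAIL, the uniform input distributions, and the sample sizes are illustrative assumptions replacing the mechanistic T-H model.

```python
# Minimal sketch of the bootstrapped-ANN idea, not the authors' code.
# `th_code` is a cheap stand-in for the long-running mechanistic T-H
# model; the threshold, input ranges and sample sizes are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
T_FAIL = 2.0  # hypothetical failure threshold on the system response

def th_code(x):
    # Stand-in for one run of the expensive T-H code per input sample.
    return x[:, 0] ** 2 + np.sin(3 * x[:, 1]) + 0.5 * x[:, 2]

# Limited-size training set: 100 runs of the "expensive" code.
X_train = rng.uniform(-1, 1, size=(100, 3))
y_train = th_code(X_train)

# Large MC sample evaluated only on the fast surrogate.
X_mc = rng.uniform(-1, 1, size=(10000, 3))

# Bootstrap ensemble: refit an ANN on resampled training data B times.
B, estimates = 30, []
for _ in range(B):
    idx = rng.integers(0, len(X_train), len(X_train))
    ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000)
    ann.fit(X_train[idx], y_train[idx])
    estimates.append(np.mean(ann.predict(X_mc) > T_FAIL))

p_hat = np.mean(estimates)                      # point estimate
lo, hi = np.percentile(estimates, [2.5, 97.5])  # bootstrap 95% CI
print(f"failure probability ~ {p_hat:.4f}, 95% CI [{lo:.4f}, {hi:.4f}]")
```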

    Comparison of bootstrapped artificial neural networks and quadratic response surfaces for the estimation of the functional failure probability of a thermal-hydraulic passive system

    In this work, bootstrapped artificial neural network (ANN) and quadratic response surface (RS) empirical regression models are used as fast-running surrogates of a thermal-hydraulic (T-H) system code to reduce the computational burden associated with estimating the functional failure probability of a T-H passive system. The ANN and quadratic RS models are built on a few data representative of the input/output nonlinear relationships underlying the T-H code. Once built, these models are used to perform, in reasonable computational time, the numerous system response calculations required for failure probability estimation. A bootstrap of the regression models is implemented to quantify, in terms of confidence intervals, the uncertainties associated with the estimates provided by the ANNs and RSs. The alternative empirical models are compared on a case study of an emergency passive decay heat removal system of a gas-cooled fast reactor (GFR).
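
    For comparison, a sketch of the quadratic RS alternative under the same illustrative assumptions as the ANN sketch above: a degree-2 polynomial surrogate fitted on the same small number of code runs, then used for the cheap Monte Carlo failure estimate. Wrapping the fit in the same bootstrap loop would give confidence intervals for the RS estimate as well.

```python
# Sketch of the quadratic response surface (RS) surrogate, under the
# same illustrative assumptions as the ANN sketch above.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
T_FAIL = 2.0  # hypothetical failure threshold

def th_code(x):
    # Cheap stand-in for the T-H system code.
    return x[:, 0] ** 2 + np.sin(3 * x[:, 1]) + 0.5 * x[:, 2]

X_train = rng.uniform(-1, 1, size=(100, 3))
y_train = th_code(X_train)

# Degree-2 polynomial surrogate: y ~ quadratic function of the inputs.
poly = PolynomialFeatures(degree=2)
rs = LinearRegression().fit(poly.fit_transform(X_train), y_train)

X_mc = rng.uniform(-1, 1, size=(10000, 3))
p_rs = np.mean(rs.predict(poly.transform(X_mc)) > T_FAIL)
print(f"quadratic-RS failure probability ~ {p_rs:.4f}")
```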

    Bulk Power Grid Risk Analysis: Ranking Infrastructure Elements According to their Risk Significance

    Disruptions in the bulk power grid can result in very diverse consequences, including economic, social, physical, and psychological impacts. In addition, power outages do not affect all end-users of the system in the same manner. For these reasons, a risk analysis of bulk power systems requires more than determining the likelihood and magnitude of power outages; it must also include the diverse impacts power outages have on the users of the system. We propose a methodology for performing a risk analysis on the bulk power system. A power flow simulation model is used to determine the likelihood and extent of power outages when components within the system fail to perform their designed function. The consequences associated with these failures are determined by looking at the type and number of customers affected. Stakeholder input is used to evaluate the relative importance of these consequences. The methodology culminates in a ranking of each system component by its risk significance to the stakeholders. The analysis is performed for failures of infrastructure elements due to both random causes and malevolent acts.
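
    A toy sketch of the final ranking step, assuming the outage likelihoods and stakeholder-weighted consequence scores have already been produced by the power flow simulation and the stakeholder elicitation; every element, rate, score, and weight below is hypothetical.

```python
# Toy illustration of the ranking step: risk significance of each grid
# element = outage likelihood x stakeholder-weighted consequence.
# All elements, rates, scores and weights below are hypothetical.
failure_rate = {  # modelled outages per year for each element
    "line_A": 0.20, "line_B": 0.05, "substation_C": 0.01}
consequence = {   # impact scores per outage, by consequence type
    "line_A":       {"economic": 1.0, "social": 0.5, "physical": 0.1},
    "line_B":       {"economic": 4.0, "social": 2.0, "physical": 0.5},
    "substation_C": {"economic": 9.0, "social": 6.0, "physical": 3.0}}
weights = {"economic": 0.5, "social": 0.3, "physical": 0.2}  # stakeholder input

risk = {elem: failure_rate[elem] * sum(weights[k] * v for k, v in cons.items())
        for elem, cons in consequence.items()}
for elem, r in sorted(risk.items(), key=lambda kv: -kv[1]):
    print(f"{elem}: risk significance {r:.3f}")
```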

    Ranking the Risks from Multiple Hazards in a Small Community

    Natural hazards, human-induced accidents, and malicious acts have caused great losses and disruptions to society. Since September 11, 2001, critical infrastructure protection has been a national focus in the United States and is likely to remain one for the foreseeable future. Damage to our infrastructures and assets could be mitigated through pre-disaster planning and actions. We have developed a systematic methodology to assess and rank the risks from these multiple hazards in a community of 20,000 people. It is an interdisciplinary study that includes probabilistic risk assessment, decision analysis, and expert judgment. Scenarios are constructed to show how initiating events evolve into undesirable consequences. A value tree, based on multi-attribute utility theory, is used to capture the decision maker's preferences about the impacts on the infrastructures and other assets. The risks from random failures are ranked according to their Expected Performance Index, which is the product of the frequency, probability, and consequence of a scenario. Risks from malicious acts are ranked according to their Performance Index, because the frequency of attack is not available. A deliberative process is used to capture the factors that could not be addressed in the analysis and to scrutinize the results. This methodology provides a framework for the development of a risk-informed decision strategy. Although this study uses the Massachusetts Institute of Technology campus as a test-bed, the methodology is general and could be used by other similar communities and municipalities.
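
    A minimal illustration of the ranking metric described above, with invented scenarios: the Expected Performance Index (EPI) multiplies the initiating-event frequency, the conditional probability of the scenario, and its multi-attribute consequence score; for malicious acts the frequency term is dropped.

```python
# Invented scenarios illustrating the ranking metric: EPI is the
# product of initiating-event frequency, conditional probability, and
# the multi-attribute consequence score from the value tree.
scenarios = [
    # (name, frequency per year, P(consequence | event), consequence score)
    ("flood_damages_substation", 0.10, 0.30, 0.6),
    ("lab_fire_spreads",         0.05, 0.50, 0.8),
    ("water_main_break",         0.50, 0.20, 0.2),
]
epi = [(name, f * p * c) for name, f, p, c in scenarios]
for name, score in sorted(epi, key=lambda kv: -kv[1]):
    print(f"{name}: EPI = {score:.4f}")
# For malicious acts the attack frequency is unavailable, so the
# ranking would use PI = probability x consequence instead.
```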

    Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus to ensure mission and programmatic success and to achieve and maintain high safety standards. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. NASA has also adopted the Risk-Informed Decision Making (RIDM) process [1-1] to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs, and one would expect PRA to be no exception. Indeed, it would be natural for NASA to lead in PRA because, as a technology pioneer, it uses risk assessment and management, implicitly or explicitly, on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2], and it intends to set probabilistic requirements for any new human spaceflight transportation system acquisition. Methods for risk and reliability assessment originated in U.S. aerospace and missile programs in the early 1960s; fault tree analysis (FTA) is an example. It would therefore have been reasonable to expect NASA to become the world leader in the application of PRA. That, however, did not happen. Early in the Apollo program, estimates of the probability of a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values, and NASA was discouraged from performing further quantitative risk analyses until some two decades later, when the methods had become more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) for system safety assessment.

    Oil prices, tourism income and economic growth: A structural VAR approach for European Mediterranean countries

    In this study, a Structural VAR model is employed to investigate the relationship among oil price shocks, tourism variables, and economic indicators in four European Mediterranean countries. In contrast with the current tourism literature, we distinguish between three oil price shocks: supply-side, aggregate demand, and oil-specific demand shocks. Overall, our results indicate that oil-specific demand shocks contemporaneously affect inflation and the tourism sector equity index, whereas these shocks do not seem to have any lagged effects. By contrast, aggregate demand oil price shocks exert a lagged effect, either directly or indirectly, on tourism-generated income and economic growth. The paper does not provide any evidence that supply-side shocks trigger responses from the remaining variables. The results are important for tourism agents and policy makers, should they need to create hedging strategies against future oil price movements or plan for economic policy developments.
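
    A hypothetical sketch of the general approach, using a reduced-form VAR with a recursive (Cholesky) ordering as a stand-in for the paper's structural identification; the variable names, ordering, lag choice, and simulated data are all assumptions for illustration.

```python
# Hypothetical sketch: a reduced-form VAR with a recursive (Cholesky)
# ordering standing in for the paper's structural identification.
# Variable names, ordering, lags and the simulated data are assumptions.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
cols = ["oil_supply", "aggregate_demand", "oil_specific_demand",
        "tourism_income", "gdp_growth"]
# Simulated stationary series in place of the actual data.
data = pd.DataFrame(rng.standard_normal((200, 5)), columns=cols)

res = VAR(data).fit(maxlags=4, ic="aic")
irf = res.irf(12)  # impulse responses up to 12 periods ahead

# Orthogonalised IRFs follow the Cholesky ordering of `cols`, e.g. the
# response of tourism_income to an oil-specific demand shock:
resp = irf.orth_irfs[:, cols.index("tourism_income"),
                     cols.index("oil_specific_demand")]
print(resp)
```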

    Label-free electrochemical impedance biosensor to detect human interleukin-8 in serum with sub-pg/ml sensitivity

    Biosensors with high sensitivity and short time-to-result that are capable of detecting biomarkers in body fluids such as serum are an important prerequisite for early diagnostics in modern healthcare provision. Here, we report the development of an electrochemical impedance-based sensor for the detection in serum of human interleukin-8 (IL-8), a pro-angiogenic chemokine implicated in a wide range of inflammatory diseases. The sensor employs a small and robust synthetic non-antibody capture protein based on a cystatin scaffold that displays high affinity for human IL-8, with a KD of 35 ± 10 nM, and excellent ligand specificity. The change in the phase of the electrochemical impedance from the serum baseline, ∆θ(f), measured at 0.1 Hz, was used as the measure for quantifying the IL-8 concentration in the fluid. The optimal sensor signal was observed after 15 min of incubation, and the sensor exhibited a linear response versus the logarithm of IL-8 concentration from 900 fg/ml to 900 ng/ml. A detection limit of around 90 fg/ml, significantly lower than the basal clinical levels of 5-10 pg/ml, was observed. Our results are significant for the development of point-of-care and early diagnostics, where high sensitivity and short time-to-result are essential.
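
    A small sketch of how such a calibration works, assuming the reported linear ∆θ-versus-log-concentration response; the calibration points below are invented, not the paper's data.

```python
# Invented calibration points illustrating the reported read-out: the
# phase change at 0.1 Hz is linear in log10(concentration) over
# 900 fg/ml - 900 ng/ml, so a fitted line inverts signal to dose.
import numpy as np

conc_ng_ml = np.array([9e-4, 9e-3, 9e-2, 9e-1, 9e0, 9e1, 9e2])
delta_theta = np.array([0.8, 1.5, 2.3, 3.1, 3.8, 4.6, 5.4])  # degrees

slope, intercept = np.polyfit(np.log10(conc_ng_ml), delta_theta, 1)

def to_concentration(dtheta):
    """Invert the calibration line: phase change -> IL-8 (ng/ml)."""
    return 10 ** ((dtheta - intercept) / slope)

print(f"delta-theta = 2.7 deg -> {to_concentration(2.7) * 1e3:.1f} pg/ml")
```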